Discussion of “Least Angle Regression” by Efron

Author

  • Sanford Weisberg
Abstract

Most of this article concerns the uses of LARS and the two related methods in the age-old, "somewhat notorious," problem of "[a]utomatic model-building algorithms …" for linear regression. In the following, I will confine my comments to this notorious problem and to the use of LARS and its relatives to solve it.

1. The implicit assumption. Suppose the response is y, and we collect the m predictors into a vector x, the realized data into an n × m matrix X, and the observed responses into the n-vector Y. If P is the projection onto the column space of (1, X), then LARS, like ordinary least squares (OLS), assumes that, for the purposes of model building, Y can be replaced by Ŷ = PY without loss of information. In large samples, this is equivalent to the assumption that the conditional distributions F(y|x) can be written as

    F(y|x) = F(y|x′β)    (1.1)

for some unknown vector β. Efron, Hastie, Johnstone and Tibshirani use this assumption in the definition of the LARS algorithm and in estimating the residual variance by σ̂² = ‖(I − P)Y‖² / (n − m − 1). For LARS to be reasonable, we need some assurance that this particular assumption holds or that it is relatively benign. If the assumption is not benign, then LARS, like OLS, is unlikely to produce useful results. A more general alternative to (1.1) is

    F(y|x) = F(y|x′B),    (1.2)

where B is an m × d matrix of rank d. The smallest value of d for which (1.2) holds is called the structural dimension of the regression problem [Cook (1998)]. An obvious precursor to fitting linear regression is deciding on the structural dimension, not proceeding as if d = 1. For the diabetes data used …
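As a concrete illustration of the quantities behind (1.1), here is a minimal numpy sketch (not taken from the discussion itself; the simulated data, sample size n, and predictor count m below are hypothetical placeholders):

    import numpy as np

    # P projects onto the column space of (1, X); model building treats
    # Yhat = P @ Y as a full replacement for Y, and the residual variance
    # is estimated from the discarded component (I - P) @ Y.
    rng = np.random.default_rng(0)
    n, m = 100, 5                             # hypothetical sizes
    X = rng.normal(size=(n, m))
    Y = X @ rng.normal(size=m) + rng.normal(size=n)

    Z = np.column_stack([np.ones(n), X])      # design matrix (1, X)
    P = Z @ np.linalg.pinv(Z)                 # projection onto col(1, X)

    Y_hat = P @ Y                             # all that LARS/OLS retains of Y
    resid = Y - Y_hat                         # (I - P) @ Y
    sigma2_hat = resid @ resid / (n - m - 1)  # ||(I - P)Y||^2 / (n - m - 1)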

Related articles

Discussion of “Least Angle Regression” by Efron

Algorithms for simultaneous shrinkage and selection in regression and classification provide attractive solutions to knotty old statistical challenges. Nevertheless, as far as we can tell, Tibshirani’s Lasso algorithm has had little impact on statistical practice. Two particular reasons for this may be the relative inefficiency of the original Lasso algorithm and the relative complexity of more...


Discussion of “Least Angle Regression” by Efron

I have enjoyed reading the work of each of these authors over the years, so it is a real pleasure to have this opportunity to contribute to the discussion of this collaboration. The geometry of LARS furnishes an elegant bridge between the Lasso and Stagewise regression, methods that I would not have suspected to be so related. Toward my own interests, LARS offers a rather different way to const...

Forward stagewise regression and the monotone lasso

Abstract: We consider the least angle regression and forward stagewise algorithms for solving penalized least squares regression problems. In Efron, Hastie, Johnstone & Tibshirani (2004) it is proved that the least angle regression algorithm, with a small modification, solves the lasso regression problem. Here we give an analogous result for incremental forward stagewise regression, showing tha...
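To make the incremental forward stagewise algorithm discussed here concrete, a minimal sketch follows (assuming a centered response and standardized columns of X; the step size eps and step count are illustrative choices, not values from the paper):

    import numpy as np

    def forward_stagewise(X, y, eps=0.01, n_steps=5000):
        # Incremental forward stagewise regression: repeatedly find the
        # predictor most correlated with the current residual and nudge
        # its coefficient by eps in the direction of that correlation.
        n, m = X.shape
        beta = np.zeros(m)
        resid = y.astype(float).copy()
        for _ in range(n_steps):
            corr = X.T @ resid            # correlations with residual
            j = np.argmax(np.abs(corr))   # most correlated predictor
            step = eps * np.sign(corr[j])
            beta[j] += step
            resid -= step * X[:, j]       # update residual
        return beta

As eps shrinks, the coefficient profiles of this procedure approach the monotone paths studied in the paper.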

An ordinary differential equation based solution path algorithm.

Efron, Hastie, Johnstone and Tibshirani (2004) proposed Least Angle Regression (LAR), a solution path algorithm for least squares regression. They pointed out that a slight modification of LAR gives the LASSO (Tibshirani, 1996) solution path. However, it is largely unknown how to extend this solution path algorithm to models beyond least squares regression. In this work, we propose a...
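For reference, the LAR and LASSO solution paths that this paper takes as its starting point can be traced with scikit-learn's lars_path; the snippet below is an illustration on simulated placeholder data, not the ODE-based method proposed in the paper:

    import numpy as np
    from sklearn.linear_model import lars_path

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 8))             # placeholder data
    y = X[:, 0] - 2.0 * X[:, 3] + rng.normal(size=100)

    # method="lar" gives the LAR path; method="lasso" applies the slight
    # modification that yields the LASSO solution path.
    alphas, active, coefs = lars_path(X, y, method="lasso")
    # coefs[:, k] is the coefficient vector at breakpoint alphas[k];
    # `active` lists predictors in the order they enter the model.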

Publication date: 2004